Patent abstract:
The invention relates to a system for automatically inspecting a surface of an object of the aircraft (54), transport vehicle, building or engineering structure type, said surface being liable to present a defect. The system is characterized in that it comprises a fleet comprising at least one flying robot (14a, 14b, 14c), each flying robot comprising a module for acquiring images of at least a portion of the surface to be inspected, and a module for processing the acquired images adapted to provide information representative of the state of each inspected surface portion, called the result of the processing.
Publication number: FR3037429A1
Application number: FR1555452
Filing date: 2015-06-15
Publication date: 2016-12-16
Inventor: Matthieu Claybrough
Applicant: Matthieu Claybrough
IPC main class:
Patent description:

[0001] FIELD OF THE INVENTION The invention relates to a system and a method for automatically inspecting large objects. In particular, the invention relates to the detection and location of defects on hard-to-access surfaces of such large objects. BACKGROUND OF THE INVENTION The technical field of the invention is the detection and localization of defects visible to the human eye over large areas, such as the external surfaces of large objects, for example aircraft, ships, trains, motor vehicles, buildings or engineering structures. Throughout the text, the term "object" refers in general to any concrete thing perceptible by sight and touch, made by man and intended for a certain purpose. Large objects, that is to say objects whose dimensions, size and/or weight do not allow them to be carried by a human, generally have surfaces that are difficult to access, for example portions at height on buildings or engineering structures, the outer hull of large ships, the structure of an oil rig, or the upper part of trainsets or of aircraft fuselages and wings. The detection and localization of faults visible to the eye on this type of large object thus raise several problems, in particular problems of visual access to the surfaces, of detection of defects, and of localization of defects in a reference frame tied to the object. The defects to be detected are, for example, impacts of lightning, hail, birds or debris, or defects such as corrosion, erosion, paint runs, cracks, etc.
[0002] The detection and localization of defects are carried out by inspecting the surfaces of the objects. Several inspection systems and procedures have been proposed to address these issues. Currently, inspections are generally carried out by human operators. Specific equipment is used to give these operators visual access to the surfaces, for example aerial work platforms, scaffolding, etc. For the most difficult-to-access surfaces, the operator may also have to use binoculars or equivalent optical means. Human operators are specifically trained to detect defects by inspecting surfaces. The detection of these defects therefore relies on the experience and judgment of the operators. Once a fault has been detected, the operator is responsible for locating it, that is to say, recording the location of the detected fault either absolutely or, more commonly, relative to one or more marks present on the object. Depending on the type of object inspected, these markers can be, for example, windows or portholes, structural elements such as cables, poles, columns, frames, rails or stringers, text markers, particular distinctive elements, etc. A defect is thus located by first determining at least one reference mark, and then measuring the position of the defect with respect to each reference mark. Such inspections by one or more human operators, however, have several disadvantages. Regarding visual access to the surfaces, the installation of specific equipment such as scaffolding is long and expensive, and does not always provide easy access to the surfaces to be inspected. The use of binoculars or equivalent optical means to overcome this disadvantage is not satisfactory because it reduces the efficiency of the inspection. In addition, specific equipment generally entails increased safety risks for the operator, in particular risks of falling, crushing or any other risk linked to the use of equipment such as nacelles or scaffolding. The specific equipment also entails risks for the object, in particular risks of collision which may lead to damage. Depending on the fragility of the object, such damage can have major repercussions, such as immobilization (for vehicles and aircraft), costly repairs or even a permanent withdrawal from service. These disadvantages and risks grow with the number of operators involved. Detection by operators is also imperfect, because an operator may forget to visually scan a portion of the surface, especially if it is difficult to access, and may treat surfaces unevenly depending on whether or not they are easily accessible. Finally, the subjectivity of each operator can lead to different classifications of the perceived elements (for example between serious defects, insignificant defects, normal wear or stains), which can cause certain defects to be ignored or detected late. In addition, the operators must be specifically trained for the inspection, which reduces the number of operators able to carry it out and requires additional management of the availability and cost of a team of trained operators. Finally, the localization techniques can lead to errors, especially in the choice and identification of the reference marks, for example when this choice requires counting a large number of repetitive elements (portholes, windows, columns, etc.), to which are added the usual measurement errors from the mark or marks.
Inspections with existing systems and processes also raise an additional problem of speed. Current inspections generally require taking the object out of service for a long time. To improve the speed of inspection, the number of operators in charge of it must be increased, which generates additional costs and increases the risks mentioned above. Solutions have been proposed to overcome these disadvantages. For example, the use of a rolling robot equipped with optical means improves the detection and localization of defects and thus limits the subjectivity of the detection and the localization errors. However, the problem of visual access remains, and the process is not fast, since the intervention of an operator is necessary for each detection. Another solution is to place the object to be inspected in a hangar equipped with a plurality of cameras that can inspect the surface of the object. Such a system is, however, not usable on buildings and engineering structures, is not movable and is not flexible. In particular, it requires bringing the object to be inspected, for example an aircraft, into the hangar, which is expensive and complex.
[0003] The inventors have therefore sought to provide an inspection system and method which overcome at least some of the disadvantages of known systems and methods. OBJECTIVES OF THE INVENTION The invention aims at overcoming at least some of the disadvantages of known surface inspection systems and methods.
[0004] In particular, the invention aims to provide, in at least one embodiment of the invention, an inspection system and method that limit the number of human operators required. The invention also aims to provide, in at least one embodiment, an inspection system and method able to inspect difficult-to-access surfaces without the need for specific equipment such as scaffolding. The invention also aims to provide, in at least one embodiment of the invention, an inspection system and method allowing rapid inspection of a large area.
[0005] It is another object of the invention to provide, in at least one embodiment, an inspection system and method enabling inspection by untrained or lightly trained human operators. The invention also aims to provide, in at least one embodiment, an inspection system and method providing better localization of defects on the surface. The invention also aims to provide, in at least one embodiment, an inspection system and method providing increased repeatability of inspections and of defect detection. 4. DESCRIPTION OF THE INVENTION To this end, the invention relates to a system for automatically inspecting a surface of an object of the aircraft, transport vehicle, building or engineering structure type, said surface being liable to present a defect, characterized in that it comprises a fleet comprising at least one flying robot, each flying robot comprising: - a module for acquiring images of at least a portion of the surface to be inspected, and - a module for processing the acquired images, adapted to provide information representative of the state of each inspected surface portion, called the result of the processing. An automatic inspection system according to the invention therefore allows inspection by means of one or more flying robots (commonly called drones): the inspection system greatly reduces the number of human operators required, because each flying robot of the fleet inspects a portion of the surface of the object to be inspected. The quantity of equipment needed is thus reduced, as well as the safety risks for the operators.
[0006] In addition, the ability of flying robots to move in the air provides easy access to portions of the object's surface that are difficult to reach, for example the upper part of an aircraft or of a train. These portions thus benefit from an inspection of a quality comparable to that of the more accessible portions. The use of a fleet of flying robots makes it possible to improve the speed of inspection, especially over very large areas, by using a number of flying robots adapted to the surface to be inspected. While the large size of the specific equipment of previous systems, such as nacelles and scaffolding, limited the number of operators able to perform an inspection simultaneously, the small footprint of the flying robots makes it possible to use a large number of these robots to carry out a faster inspection. In addition, the system is lightweight, easily transportable and therefore mobile, that is to say it can be moved to the object and does not require moving the object to a particular place. The processing of the data acquired by the acquisition module of the flying robot, and in particular the processing of at least one image of a portion of the inspected surface, is performed in a processing module embedded in the flying robot, which accelerates the inspection and the detection of defects, limits the number of human operators required for the inspection, and provides a more consistent inspection and detection over the surface. The processing provides information representative of the state of the inspected surface, and in particular determines the presence of any defects on the surface: this determination is not subject to the subjectivity of a human operator and therefore allows greater consistency in the inspection. In addition, the inspection no longer requires human operators specifically trained for inspection. The information representative of the state of the inspected surface, provided by the processing module, is hereinafter referred to as the result of the processing. The result of the processing includes in particular the presence or absence of a potential defect on the inspected surface portion. Thanks to the processing performed by each flying robot, an untrained or lightly trained human operator can review the results of the processing and focus only on the portions of the surface presenting a potential defect, without needing to inspect the surface visually or to look at images of all the surface portions. In addition, each flying robot can transmit only the results of the processing and not images of the entire surface, which reduces the amount of data transmitted throughout the system and allows the use of a larger number of flying robots to speed up the inspection. The processing load is thus distributed among the flying robots. Preferably, the flying robot is a helicopter-type or multirotor (usually quadrotor) robot, able to hover. This type of flying robot can take off from and land on a small area, move at a variable speed, in particular at a slow speed allowing better positioning accuracy and more safety, and is able to stop and change direction, or even reverse course, if there is an obstacle in its path. Hovering or flying at slow speed also facilitates the acquisition of images and improves the quality of the images acquired. Advantageously and according to the invention, the fleet comprises between one and ten flying robots.
Preferably, the fleet comprises three flying robots, which offers a good compromise between speed of execution, cost and reduction of the risk of collision between flying robots. Advantageously and according to the invention, the image acquisition module of at least one robot of said fleet comprises at least one camera adapted to acquire images in the visible light spectrum.
[0007] According to this aspect of the invention, the camera allows the system to perform a visual inspection as a human operator would. The system performs image processing on an image in the visible light spectrum, referred to as the visible image, enabling the use of known, effective and proven techniques for processing visible images. Advantageously, an automatic inspection system according to the invention comprises a module for managing the fleet of robots, the management module being adapted to determine, from a model of the surface to be inspected, a set of moving instructions and image acquisition instructions for each robot of the fleet. According to this aspect of the invention, the management module allows centralized programming of the inspection through the determination of instructions for each robot of the fleet, as a function, for example, of the number of robots available in the fleet, the type of the object to be inspected, the size of the surface to be inspected, the inspection time, etc. Each instruction determines a task to be executed by each robot. The robots of the fleet are thus self-piloted, performing the tasks assigned to them. They therefore do not require human operators to control them, reducing the need for operator training as well as the risks of piloting error that could lead, for example, to collisions between robots of the fleet or between a robot of the fleet and the object. In addition, each robot can perform its inspection without the need for visual contact between a human operator and the robot. Preferably, the management module determines the instructions so that the inspection is performed in a minimum time given the number of robots in the fleet. According to variants of the invention, the management module can be embedded in a robot of the fleet, included in an independent management device, for example a computer, or distributed between different robots of the fleet. Preferably, in the variants where the management module is included in a management device independent of the robots of the fleet, the management device comprises a man/machine interface allowing interaction with a human operator. Advantageously, a system according to the invention comprises a device for presenting the results of each processing carried out by the processing module of each robot of the fleet, and each robot of the fleet comprises a communication module adapted to transmit the results of each processing to the presentation device. According to this aspect of the invention, each flying robot transmits the results of the processing performed on the acquired images to the presentation device so that a human operator can interpret them. The results transmitted by each robot of the fleet being representative of the state of the inspected surface, they make it possible in particular to propose a classification of the potential defects detected, and for example to display on a screen of the presentation device the acquired image, the associated result and a classification of the potential defect, or to generate a report including the list of the potential defects detected. Advantageously, in an embodiment where the management module is included in a management device, the management device and the presentation device are combined in a control device. The functions of the management module and of the presentation device are then brought together in one and the same device.
Advantageously, the control device comprises a man/machine interface adapted to display a 3D model of the surface to be inspected and to display a representation of the real-time position of each robot of the fleet with respect to the surface to be inspected. According to this aspect of the invention, the man/machine interface allows a human operator to see the position of each flying robot with respect to the object and its surface, to see the potential defects displayed on the 3D model, and possibly to intervene on the flying robots if necessary, for example to trigger an emergency stop.
[0008] Advantageously and according to the invention, each robot of the fleet comprises a location module adapted to associate with each processing result a location of this result in a reference frame tied to the surface to be inspected.
[0009] According to this aspect of the invention, the system allows a more precise localization of the processing results, and therefore of the potential defects, than localization by a human operator according to the prior art. The location of a result is determined by each robot of the fleet as a function of the location of said robot and of the parameters of the image acquisition module at the moment the image is acquired.
[0010] According to a variant of the invention, the image processing module can recognize elements of the object whose location is known, thus making it possible to refine the location of the results. If the inspection system includes a presentation device, it is adapted to present each result and the location associated with said result.
[0011] Advantageously and according to the invention, each robot of the fleet comprises an emergency module adapted to detect a failure of the robot and, from a set of emergency tasks determined according to the position of the robot of the fleet relative to the object, said robot of the fleet is capable of performing at least one emergency task in case of failure. According to this aspect of the invention, an emergency maneuver is permanently determined, and the robot performs this emergency maneuver in case of failure, for example loss of connection with a management module, failure of a location module, engine failure, etc. The emergency maneuvers prevent damage to the object whose surface is inspected, especially when this surface is fragile (an aircraft, for example). Emergency tasks are generally intended to move the robot away from the object, and depend on the location of the robot in relation to the object. The set of emergency tasks is determined, according to several variants of the invention, either by the management module or by each robot. In the case where each robot determines its own set of emergency tasks, each robot transmits this set to the management module so that it can determine whether a robot performing these tasks might collide with another robot. Advantageously and according to the invention, each robot of the fleet comprises a buffer memory module adapted to store a plurality of processing results.
[0012] According to this aspect of the invention, the processing results may be stored pending transmission to an external backup system, for example included in the presentation device. In addition, in the event of a robot failure, the results stored in the buffer module can be retrieved manually if they have not been transmitted.
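As an illustration only, the following Python sketch shows one possible buffering behavior consistent with this aspect and with the acknowledgment mechanism described later in the embodiment (paragraph [0026]): a result stays in the robot's buffer until the presentation device acknowledges it. The class and the transport abstraction are assumptions made for the sketch, not elements of the invention.

```python
from collections import deque

# Minimal sketch of the buffer module, under the assumption that the
# transport layer reports whether the presentation device acknowledged
# each transmitted result.
class ResultBuffer:
    def __init__(self):
        self._pending = deque()

    def store(self, result):
        self._pending.append(result)       # kept until acknowledged

    def transmit_all(self, send):
        """send(result) returns True when the device acknowledges."""
        while self._pending:
            if send(self._pending[0]):
                self._pending.popleft()    # ack received: drop from buffer
            else:
                break                      # retry later; results survive a failure

buf = ResultBuffer()
buf.store({"defect": "scratch", "location": (12.3, 0.8)})
buf.transmit_all(lambda r: True)           # presentation device acknowledges
print(len(buf._pending), "result(s) still buffered")
```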
[0013] Advantageously and according to the invention, each robot of the fleet comprises an obstacle detection module, each robot of the fleet being adapted to perform a task of avoiding at least one obstacle detected by the obstacle detection module.
[0014] According to this aspect of the invention, each flying robot is adapted to modify its displacement in the event of obstacle detection. Preferably, if an obstacle detection module detects an obstacle, the robot concerned is adapted to transmit the position of said obstacle to the other robots of the fleet. The information concerning the position of the obstacle is thus shared, and the robots of the fleet can react accordingly, for example by modifying their trajectory. Advantageously and according to the invention, the fleet comprises at least one rolling robot, each rolling robot comprising: an image acquisition module of at least a portion of the surface to be inspected, and a module for processing the acquired images adapted to provide information representative of the state of each inspected surface portion, called the result of the processing.
[0015] According to this aspect of the invention, the fleet comprising at least one flying robot can be supplemented by a rolling robot comprising the same modules, so as to reach areas that are difficult for flying robots to access, for example under the fuselage of an aircraft. The invention also relates to a method of using an automatic inspection system according to the invention, characterized in that it comprises: a step of determination by the management module of a set of instructions assigned to each robot of the fleet; a step of execution of tasks by each robot of the fleet, said tasks comprising at least one acquisition by the acquisition module of an image of a portion of the surface to be inspected, and at least one processing by the processing module of said image so as to detect a potential defect on the surface to be inspected; a step of transmission of the result of said processing from each robot to a presentation device; and a step of presentation of said result of the processing to a human operator. A method of use according to the invention therefore allows the use of an inspection system according to the invention by distributing tasks to each robot of the fleet according to the model of the surface to be inspected, so as to optimize the speed of inspection of the surface. For example, each robot is assigned a part of the surface to inspect. The step of presenting the result transmitted by one of the robots of the fleet is, for example, a display on a screen, the generation of a report, etc. Advantageously and according to the invention, the step of transmitting the result of said processing by each robot is performed after each processing by a processing module. According to this aspect of the invention, a result is transmitted directly after each processing, thus providing information about the presence of potential defects as the inspection progresses, without waiting for its end. The transmission is carried out as soon as possible, that is to say taking into account the different processing times and the availability of a transmission channel if it is shared between several flying robots.
[0016] The invention also relates to a method for automatically inspecting a surface of an object of the aircraft, transport vehicle, building or engineering structure type, said surface being liable to present a defect, characterized in that it comprises: a step of acquiring images of at least a portion of the surface to be inspected by each flying robot of a fleet of robots comprising at least one flying robot, and a step of processing the acquired images to provide information representative of the state of each inspected surface portion, called the result of the processing. Advantageously, the inspection method according to the invention is implemented by the inspection system according to the invention.
[0017] Advantageously, the inspection system according to the invention implements the inspection method according to the invention. The invention also relates to an automatic inspection system, a method of using the system and an automatic inspection method characterized in combination by all or some of the features mentioned above or below. 5. List of Figures Other objects, features and advantages of the invention will appear on reading the following description, given solely by way of non-limiting example and referring to the appended figures, in which: FIG. 1 is a schematic view of the automatic inspection system according to one embodiment of the invention; FIG. 2 is a schematic view of a robot of a fleet of an automatic inspection system according to one embodiment of the invention; FIG. 3 is a schematic view of the automatic inspection method according to one embodiment of the invention; FIG. 4 is a schematic view of a fleet of robots of an automatic inspection system according to one embodiment of the invention in which the object is an aircraft. 6. Detailed Description of an Embodiment of the Invention The following embodiments are examples. Although the description refers to one or more embodiments, this does not necessarily mean that each reference relates to the same embodiment, or that the features apply only to a single embodiment. Simple features of different embodiments may also be combined to provide other embodiments. In the figures, scales and proportions are not strictly adhered to, for purposes of illustration and clarity. FIG. 1 schematically represents a system 10 for automatically inspecting a surface of an object of the aircraft, transport vehicle (rail vehicle, motor vehicle, etc.), building or engineering structure type, or any other large object whose surface to be inspected is large, according to one embodiment of the invention. The purpose of the inspection system is to detect potential defects on the surface of the object. The inspection system 10 comprises a fleet 12 comprising at least one flying robot, here three flying robots 14a, 14b, 14c. The fleet 12 may also include one or more robots of another type, for example a rolling robot 16, or any other robot adapted to the object to be inspected, for example an underwater robot for the inspection of an oil platform. The flying robots 14a, 14b, 14c are commonly called drones or unmanned aerial vehicles (UAVs) and are of the helicopter, quadrotor or multirotor type, capable of hovering. To reduce the risk of damage to the object in the event of a collision with a robot of the fleet 12, the robots are equipped with bumpers. Each robot 14a, 14b, 14c, 16 of the fleet 12 is adapted to communicate with a management module 18 on the one hand and with a presentation device 20 on the other hand. In another embodiment, not shown, the management module 18 is embedded in one of the robots of the fleet 12.
[0018] The management module 18 and the presentation device 20 are connected, in this embodiment, to a web server 22 accessible via a telecommunication network. In addition, the management module 18 and the presentation device 20 can be integrated in the same control device, for example a computer or a tablet. A human operator 24 can interact via one or more man/machine interfaces with the management module 18 and the presentation device 20. The interface provides an interactive 3D visualization including a 3D model of the object to be inspected as well as the real-time position of the robots 14a, 14b, 14c, 16 and any potential defects found.
[0019] FIG. 2 schematically represents a robot 14 of the fleet 12 according to one embodiment of the invention. The robot 14 includes a module 26 for acquiring images of the surface to be inspected and a module 28 for processing the acquired images. The image acquisition module 26 comprises at least one sensor, for example a camera acquiring images in the visible light spectrum. To improve the quality of the acquired images, the acquisition module may also include a lighting device operating in the visible light spectrum. Furthermore, in this embodiment of the invention, the robot 14 comprises: - an emergency module 30, adapted to detect failures, - a communication module 32 for communicating with the management module 18 and the presentation device 20, - a module 34 for controlling the robot, adapted to process robot control instructions coming from the management module 18 or from the other modules embedded in the robot 14, - a location module 36, - a module 38 for stabilizing and guiding the robot, in particular controlling the motors of the robot 14 as a function of displacement commands transmitted by the control module 34, - a buffer memory module 40, - an obstacle detection module 42. These modules are embodied, for example, as electronic components; several modules can be combined in the same electronic component, and a module can be composed of a plurality of interacting electronic components. The modules can also be implemented as a computer program executed by one or more electronic components, for example a processor of a computer, a microcontroller, a DSP (Digital Signal Processor), an FPGA (Field Programmable Gate Array), etc. FIG. 3 shows a method of automatically inspecting the surface of the object according to one embodiment of the invention. The method is implemented by the automatic inspection system 10 according to the previously described embodiment. The first step of the method is a step 44 of parameterization of the inspection by the human operator 24 on the management module, via a man/machine interface. The human operator 24 has access to several parameters, for example: - the object to be inspected: the human operator chooses a model, for example a 3D model, of the surface of the object to be inspected from a selection of predefined templates, accessible for example on the web server 22; - the environment of the object: the human operator can indicate whether the object is located outdoors, in a hangar, etc., to determine whether the robots are subject to particular constraints on their movement around the object (e.g. obstacles); - the composition of the fleet 12: the number of robots used and their type (flying or not, equipped with certain types of acquisition modules or sensors, etc.); - the mission to perform: fast, thorough or partial inspection, etc. Once the parameters are validated by the human operator, the inspection process proceeds to a step 46 of determining a set of instructions. This determination step 46 is executed by the management module 18. It consists in determining, according to the parameters chosen by the human operator during the preceding step, and in particular the model of the surface to be inspected, a set of moving instructions and image acquisition instructions which are assigned to each robot of the fleet 12. The inspection is thus broken down into different tasks, which are translated into instructions for the robots of the fleet 12 so as to cover the entire surface to be inspected according to the model of said surface.
These instructions are, for example: a displacement of a flying robot from a point A to a point B, an acquisition of an image of the surface of the object at point B, a processing of said image, a displacement to a new point C, a new image acquisition, and so on (see the sketch below).
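By way of illustration only, the following Python sketch shows one possible way for the management module to derive such per-robot instruction lists from a set of surface waypoints. The waypoint representation, the round-robin allocation and the instruction vocabulary are assumptions made for the sketch; the invention only requires that moving and acquisition instructions be derived from the surface model.

```python
# Illustrative sketch: turn surface waypoints into per-robot instruction
# lists. A real management module would use the 3D surface model and a
# time-optimal allocation; round-robin is a deliberate simplification.
def build_instructions(waypoints, n_robots):
    """Split surface waypoints among robots and emit instruction lists."""
    plans = [[] for _ in range(n_robots)]
    for i, wp in enumerate(waypoints):
        plan = plans[i % n_robots]          # naive round-robin allocation
        plan.append(("move_to", wp))
        plan.append(("acquire_image", wp))
        plan.append(("process_image",))
    return plans

# e.g. three robots sharing nose, empennage and wing waypoints
waypoints = [(0, 0, 5), (10, 0, 6), (20, 4, 6), (20, -4, 6)]
for robot_id, plan in enumerate(build_instructions(waypoints, 3)):
    print(f"robot {robot_id}: {plan}")
```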
[0020] The instructions assigned to each robot of the fleet 12 are transmitted to said robots by the management module 18 to allow the execution of the tasks associated with these instructions, in a step 48 of execution of the tasks by each robot of the fleet 12. Depending on the embodiment, each robot receives either the entirety of the instructions assigned to it before the execution step 48, or only a first part of the instructions, the following instructions being sent during the execution step 48. In both cases, the management module 18 can modify the tasks being executed by sending new instructions in the event of a change of situation, for example by assigning a surface portion to a new robot if the robot that was initially to acquire an image of this surface portion has broken down. The robots of the fleet 12 can also transmit information to the management module 18 concerning their status, the progress of the execution of the tasks, and any other information that may lead the management module 18 to send new instructions. The tasks of each robot of the fleet 12 are processed by the control module 34 of said robot. In particular, the control module 34 manages the instructions received, their sequencing, the calculations related to these instructions to determine the associated tasks, the control of the various modules according to the tasks, the computation of the status of the robot, etc. During the execution step 48, each robot of the fleet 12 performs at least one acquisition of an image via its acquisition module 26 and one processing of the acquired image via its processing module 28. The acquisition of the image is performed by one or more sensors of the acquisition module, making it possible to obtain different types of image depending on the sensor used. For example, the sensors may be infrared sensors, cameras for the visible spectrum, ultraviolet sensors, or any other sensor forming an image from an electromagnetic or acoustic wave in a given frequency band.
[0021] The sensor may also be a 3D sensor, of the depth sensor type, such as a time-of-flight (TOF) depth sensor or an infrared pattern projection sensor, a stereoscopic sensor, etc. Finally, a sensor can acquire images of the same portion over several frequency spectra (hyperspectral imaging). The processing of the image by the processing module 28 consists in providing a result representative of the state of the inspected surface. The processing module 28 thus determines, from the acquired image, the presence of a potential defect on the surface, for example by comparing the acquired image with an older image of the same surface (retrieved from the web server or provided by the management module), or by detecting sudden variations of color or of aspect (fineness, grain, blur, brightness, etc.), etc. The processing module 28 uses predefined and preset algorithms. For example, according to one embodiment of the invention, the processing module 28 implements the following steps for each image: - a first step of normalization of the image, applying a set of digital filters configured according to the image parameters so as to compensate for external variations (lighting, etc.), to reduce the effect of disturbances (reflections, etc.) and to compensate for the deformations induced by the lens (geometry and lighting); - a second step of localization of the image which, from the positioning of the robot (i.e. the position of the robot in space and its orientation angle), the position and angle of the acquisition sensor of the acquisition module 26, for example a camera, and the distance from the camera to the surface, determines the coordinates, in a reference frame tied to the surface, of the set of points of the image; - a third step of segmentation of the image and extraction of the contours of all the shapes that may be potential defects, and generation of a sub-image containing the potential defect, also called a thumbnail, for each of these shapes. An image can lead to the generation of zero, one or several thumbnails.
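As an illustration, the following Python sketch, using OpenCV, shows the first and third steps (normalization, then segmentation into thumbnails) under simple assumptions. The specific filters and thresholds (histogram equalization, Canny edges, minimum contour area) are choices made for the sketch, not those fixed by the invention; the localization step is treated separately below.

```python
import cv2
import numpy as np

# Sketch of normalization + segmentation. Filter choices and thresholds
# are illustrative assumptions; the patent names the steps without
# fixing their implementation.
def extract_thumbnails(image_bgr, min_area=50):
    # normalization: compensate lighting with histogram equalization
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    norm = cv2.equalizeHist(gray)
    # segmentation: extract contours of shapes that may be defects
    edges = cv2.Canny(norm, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    thumbnails = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue                      # ignore tiny artifacts
        x, y, w, h = cv2.boundingRect(c)
        thumbnails.append(image_bgr[y:y + h, x:x + w])
    return thumbnails  # zero, one or several thumbnails per image

img = np.full((480, 640, 3), 200, np.uint8)
cv2.circle(img, (320, 240), 30, (40, 40, 40), -1)  # synthetic "defect"
print(len(extract_thumbnails(img)), "thumbnail(s) found")
```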
[0022] The processing module then calculates, for each thumbnail of the image, a set of parameters according to pre-recorded algorithms, and then classifies and characterizes the thumbnails from this set of parameters. A distinction can be made for algorithms that require no data other than the thumbnail itself, whose calculated parameters are called descriptors. For each parameter, the following steps are performed: application of a digital filter, then calculation of the parameter on the filtered thumbnail. The filter is chosen according to the desired parameter: for example, a Gaussian noise reduction filter, a gradient-type filter for detecting sudden changes, a color filter for calculating the descriptors on certain frequency combinations only, or a frequency filter for detecting certain repeating patterns or textures. Several families of descriptors are used, for example: geometric descriptors (perimeter, largest dimension, smallest dimension, width/height ratio, number of breaks in the contour, average contour curvature, etc.) and descriptors directly related to the pixels: statistical moments (mean, variance, skewness, kurtosis, etc.) and other mathematical operators (maximum, difference of order, entropy, uniformity, etc.). A descriptor can also be applied to a subset of the pixels that meet a particular criterion, for example a value greater or less than a predetermined threshold. As regards algorithms using other, external data to calculate a parameter, such an algorithm comprises for example the following steps: determining a reference thumbnail by extracting from a reference image the same surface area as the thumbnail being processed; the reference image may be an image of this area taken at an earlier date (available for example on the web server 22) or an image generated by a computer from the surface model (according to a preferred embodiment, several reference images can be used, the parameters then being calculated for each reference image); then calculating parameters expressing the difference between the acquired image and each reference image, these parameters being for example mathematical norms of the difference between the thumbnail and the reference thumbnail, correlation indexes, histogram comparisons, etc. These methods are generally applied locally, around points of interest of the thumbnail. The last step is the classification and characterization of the thumbnail from the set of calculated parameters. The classification consists in determining the type of the potential defect, for example among the following categories: "oil stain", "corrosion", "missing element", "lightning strike", "scratch", "no fault", "unknown", etc. The characterization consists in determining a category of the thumbnail from a predetermined set, for example "acceptable defect" or "unacceptable defect", as well as the size of said potential defect.
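The following Python sketch illustrates a few of these descriptor families on a grayscale thumbnail: one geometric descriptor, several statistical moments, and a difference norm against a reference thumbnail. The exact descriptor set is left open by the description; the selection below is an illustrative assumption.

```python
import numpy as np
from scipy import stats

# Illustrative descriptors: geometric, pixel statistics, and a
# reference-difference parameter. The set is an assumption; the patent
# lists families of descriptors without fixing them.
def descriptors(thumbnail, reference=None):
    h, w = thumbnail.shape
    pixels = thumbnail.astype(np.float64).ravel()
    hist = np.bincount(thumbnail.ravel(), minlength=256).astype(np.float64)
    d = {
        "width_height_ratio": w / h,        # geometric descriptor
        "mean": pixels.mean(),              # statistical moments
        "variance": pixels.var(),
        "skewness": stats.skew(pixels),
        "kurtosis": stats.kurtosis(pixels),
        "maximum": pixels.max(),
        "entropy": stats.entropy(hist + 1),  # +1 smooths empty bins
    }
    if reference is not None:               # difference vs reference thumbnail
        diff = thumbnail.astype(np.float64) - reference.astype(np.float64)
        d["l2_norm_vs_reference"] = float(np.linalg.norm(diff))
    return d

rng = np.random.default_rng(0)
thumb = rng.integers(0, 256, (32, 32), dtype=np.uint8)
print(descriptors(thumb, reference=np.full((32, 32), 128, np.uint8)))
```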
[0023] Classification and characterization may be performed by a known classifier such as a linear classifier, a naive Bayesian classifier, an SVM ("Support Vector Machine") classifier, neural networks, etc. In this embodiment, all the thumbnails classified as defective or unknown, together with their location, classification and characterization, form the processing results. According to an advantageous embodiment, the results can be transmitted to the web server, which has the ability to learn, that is to say to improve its algorithms and settings as results accumulate. This web server is then able, on the one hand, to send new, more precise settings to the module 28 for processing the acquired images and, on the other hand, to resolve doubts about results classified with a low confidence index or classified in the "unknown" category.
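For illustration, the sketch below trains an SVM, one of the classifiers named above, on synthetic descriptor vectors and returns a class with a confidence value of the kind that could drive the low-confidence escalation to the web server. The labels and training data are placeholders, not the invention's categories or datasets.

```python
from sklearn.svm import SVC
import numpy as np

# Illustrative classification of thumbnails from descriptor vectors.
# Classes are a subset of the categories named in the description;
# the training data are synthetic placeholders.
CLASSES = ["no_fault", "corrosion", "lightning_strike", "scratch"]

rng = np.random.default_rng(1)
X_train = rng.normal(size=(200, 7))            # 7 descriptors per thumbnail
y_train = rng.integers(0, len(CLASSES), 200)   # placeholder labels

clf = SVC(kernel="rbf", probability=True).fit(X_train, y_train)

x_new = rng.normal(size=(1, 7))
proba = clf.predict_proba(x_new)[0]
label = CLASSES[int(clf.classes_[np.argmax(proba)])]
confidence = float(proba.max())
# a low confidence could route this result to the web server for review
print(label, f"confidence={confidence:.2f}")
```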
[0024] The result of this processing is the detection or not of a potential defect, and possibly a classification of the potential defect according to its severity. The result of each processing is stored in the buffer memory module of the robot 14.
[0025] Each result is associated with a location provided by the location module 36. This location is expressed in a reference frame tied to the surface to be inspected so as to be easily retrievable by a human operator. The location module 36 makes it possible to determine the positioning of the robot 14 and to deduce from it the location of the result. The positioning of the robot 14 is determined by one or more absolute location devices, for example a GPS (Global Positioning System), one or more inertial location devices, for example an accelerometer, a gyroscope, a magnetometer, etc., and/or one or more relative location devices, for example radar, ultrasound, laser rangefinder, infrared, image processing, ground beacons, etc., or a combination thereof. The location of the result is then determined with respect to the positioning of the robot by the image processing module, as explained above. The positioning of the robot and the location of the result can use a combination of different technologies, which are then associated, for example by hybridization via a Kalman filter, allowing a more precise location of the result. Each result of each processing and the location of said result are transmitted by the communication module 32 to the presentation device 20 during a step 50 of transmission of the result. Depending on the embodiment, the result may be transmitted at the end of the inspection or continuously during the inspection.
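The geometric principle of this localization can be sketched as follows: the robot pose, the camera orientation and the camera-to-surface distance together give the coordinates of the imaged point in the frame tied to the surface. The projection along the optical axis and the axis conventions below are illustrative assumptions, not the invention's definitions.

```python
import numpy as np

# Illustrative localization: project the camera's optical axis onto the
# surface-relative frame. Yaw/pitch conventions are assumptions made
# for the sketch.
def locate_on_surface(robot_pos, yaw_deg, pitch_deg, distance):
    """Return the imaged point in the frame tied to the surface."""
    yaw, pitch = np.radians([yaw_deg, pitch_deg])
    direction = np.array([
        np.cos(pitch) * np.cos(yaw),
        np.cos(pitch) * np.sin(yaw),
        np.sin(pitch),
    ])
    return np.asarray(robot_pos, float) + distance * direction

# robot hovering 4 m from the fuselage, camera pitched down 20 degrees
point = locate_on_surface((2.0, 5.0, 6.0), yaw_deg=90.0,
                          pitch_deg=-20.0, distance=4.0)
print(point.round(2))  # coordinates in the surface-relative frame
```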
[0026] When a result is transmitted, the presentation device 20 sends an acknowledgment to the robot 14, which then deletes the result from its buffer memory module. The result is then presented to a human operator through the presentation device 20 during a presentation step 52. The presentation of the results can take several forms and use man/machine interfaces of different types, for example comprising a 3D representation of the model used in the determination of the instructions, on which the potential defects are placed, a display of the image of the surface associated with the result on a screen, the generation of a written report, etc. The report includes, for example, a list of the potential defects detected, their location, their classification (type of defect) and their characterization (size and severity of the defect). The operator can then restart an automatic inspection process with new parameters, for example to inspect more precisely, or with new sensors, the portions where potential defects have been detected. Each result can also be saved to build up a history of the inspections of the same object. This history may be transmitted to the web server for future use, possibly in a different environment (for example for aircraft whose inspections may take place in different locations), or for additional processing. Certain situations may cause a robot of the fleet 12 to perform tasks different from those initially provided for in the instructions from the management module. For example, the obstacle detection module 42 enables the detection of obstacles and the transmission of tasks to the control module 34 of the robot, which performs these tasks so as to avoid the obstacle, and possibly signals the obstacle to the human operator 24 via the presentation device 20, as well as to the management module 18 so that it can modify the movements of the other robots of the fleet 12 if necessary and/or send new instructions to the robot having detected the obstacle. The robot can also directly inform the other robots. The emergency module 30 also makes it possible to transmit emergency tasks to the control module 34 if a failure affects the flying robot. The emergency module 30 allows failure detection. A set of emergency tasks, one for each expected failure case and adapted to the position of the robot with respect to the object, is determined either by the emergency module 30 or by the management module 18. For example, in the case of a fixed-wing aircraft, a robot above a wing will first move laterally clear of the wing before landing vertically, while a robot under the aircraft will land directly.
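The fixed-wing example above can be sketched as a simple position-dependent lookup. The zone boundaries, thresholds and maneuver names below are assumptions made for illustration; only the two maneuvers (move laterally then land, versus land directly) come from the description.

```python
from dataclasses import dataclass

@dataclass
class Position:
    x: float  # metres, in a frame tied to the inspected aircraft
    y: float  # lateral offset from the fuselage axis
    z: float  # height above ground

# Hypothetical zone test mirroring the fixed-wing example: above a wing,
# move laterally clear before landing; under the aircraft, land directly.
# Thresholds and maneuver names are assumptions.
def emergency_task(pos: Position, half_span: float = 15.0,
                   fuselage_height: float = 4.0) -> str:
    if pos.z < fuselage_height:
        return "land_directly"                 # robot under the aircraft
    if abs(pos.y) <= half_span:
        return "move_laterally_then_land"      # robot above a wing
    return "descend_and_land"                  # robot clear of the aircraft

print(emergency_task(Position(x=10.0, y=5.0, z=8.0)))  # move_laterally_then_land
```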
[0027] FIG. 4 schematically represents a fleet of robots of an automatic inspection system according to one embodiment of the invention, implementing an automatic inspection method according to one embodiment of the invention, in which the object is an aircraft 54. Three flying robots 14a, 14b, 14c are shown.
[0028] The step of determining a set of instructions makes it possible, for example, to assign to each robot of the fleet, via moving and image acquisition instructions, tasks relating to a part of the surface to be inspected. For example, in the embodiment shown, a first flying robot 14a inspects the surface of the front 58 of the fuselage of the aircraft 54, a second flying robot 14b inspects the surface of the empennage 56 of the aircraft 54, and a third flying robot 14c inspects the surface of a portion 57 of a wing of the aircraft 54. The flying robots 14a, 14b, 14c communicate, through their communication modules 32a, 32b, 32c and via a wireless transmission, with a control device 60 comprising the management module and the presentation device, with which a human operator interacts to follow the progress of the inspection and possibly see the potential defects detected by the flying robots 14a, 14b, 14c. The wireless transmission takes place via one or more known communication protocols, for example Zigbee (IEEE 802.15.4) for control transmissions, Wi-Fi (IEEE 802.11) for data transmissions, and possibly a different radio protocol (e.g. of the DSM2/DSMX type in the 2.4 GHz band) for emergency transmissions.
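This split of traffic across radio links can be represented as a simple routing table, as sketched below. The message classes are assumptions made for illustration; only the protocol-to-traffic mapping comes from the description.

```python
# Illustrative routing of fleet traffic over the protocols named above.
# The message classes and this routing table are sketch assumptions.
LINKS = {
    "control":   "Zigbee (IEEE 802.15.4)",  # low-rate, robust control channel
    "data":      "Wi-Fi (IEEE 802.11)",     # high-rate channel for results
    "emergency": "DSM2/DSMX (2.4 GHz)",     # separate radio for emergencies
}

def route(message_class: str) -> str:
    """Return the radio link a message class should be sent over."""
    try:
        return LINKS[message_class]
    except KeyError:
        raise ValueError(f"unknown message class: {message_class!r}")

print(route("data"))       # Wi-Fi (IEEE 802.11)
print(route("emergency"))  # DSM2/DSMX (2.4 GHz)
```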
Claims:
Claims (13)
[0001]
CLAIMS 1. A system for automatically inspecting a surface of an object of the aircraft (54), transport vehicle, building or engineering structure type, said surface being liable to present a defect, characterized in that it comprises a fleet (12) comprising at least one flying robot (14, 14a, 14b, 14c), each flying robot comprising: - a module (26) for acquiring images of at least a portion of the surface to be inspected, and - a module (28) for processing the acquired images, adapted to provide information representative of the state of each inspected surface portion, called the result of the processing.
[0002]
2. An automatic inspection system according to claim 1, characterized in that the module (26) for acquiring images of at least one robot of said fleet comprises at least one camera adapted to acquire images in the visible light spectrum.
[0003]
3. An automatic inspection system according to claim 1 or 2, characterized in that it comprises a module (18) for managing the fleet (12) of robots, the management module (18) being adapted to determine, from a model of the surface to be inspected, a set of moving instructions and image acquisition instructions for each robot (14, 14a, 14b, 14c, 16) of the fleet (12).
[0004]
4. An automatic inspection system according to one of claims 1 to 3, characterized in that it comprises a device (20) for presenting the results of each processing performed by the processing module (28) of each robot (14, 14a, 14b, 14c, 16) of the fleet (12), and in that each robot (14, 14a, 14b, 14c, 16) of the fleet (12) comprises a communication module (32) adapted to transmit the results of each processing to the presentation device (20).
[0005]
5. An automatic inspection system according to claims 3 and 4 taken together, characterized in that the management module (18) and the presentation device (20) are arranged in a control device (60).
[0006]
6. An automatic inspection system according to claim 5, characterized in that the control device (60) comprises a man/machine interface adapted to display a 3D model of the surface to be inspected and to display a representation of the real-time position of each robot of the fleet relative to the surface to be inspected.
[0007]
7. An automatic inspection system according to one of claims 1 to 6, characterized in that each robot of the fleet comprises a location module adapted to associate with each processing result a location of this result in a reference frame tied to the surface to be inspected.
[0008]
8. An automatic inspection system according to one of claims 1 to 7, characterized in that each robot (14, 14a, 14b, 14c, 16) of the fleet (12) comprises an emergency module (30) adapted to detect a failure of the robot and in that, from a set of emergency tasks determined according to a position of the robot of the fleet (12) with respect to the object, said robot of the fleet (12) is able to perform at least one emergency task in case of failure.
[0009]
9. An automatic inspection system according to one of claims 1 to 8, characterized in that each robot (14, 14a, 14b, 14c, 16) of the fleet (12) comprises an obstacle detection module, each robot (14, 14a, 14b, 14c, 16) of the fleet (12) being adapted to perform a task of avoiding at least one obstacle detected by the obstacle detection module.
[0010]
10. An automatic inspection system according to one of claims 1 to 9, characterized in that the fleet (12) comprises at least one rolling robot (16), each rolling robot comprising: - a module (26) for acquiring images of at least a portion of the surface to be inspected, and - a module (28) for processing the acquired images, adapted to provide information representative of the state of each inspected surface portion, called the result of the processing.
[0011]
11. A method of using an automatic inspection system according to one of claims 1 to 10, characterized in that it comprises: - a step of determination by the management module (18) of a set of instructions assigned to each robot (14, 14a, 14b, 14c, 16) of the fleet (12), - a step of execution of tasks by each robot (14, 14a, 14b, 14c, 16) of the fleet (12), said tasks comprising at least one acquisition by the image acquisition module (26) of an image of a portion of the surface to be inspected, and at least one processing by the processing module (28) of said image so as to detect a potential defect on the surface to be inspected, - a step of transmission of the result of said processing from each robot to a presentation device (20), - a step of presentation of said result of the processing to a human operator (24).
[0012]
12. A method of use according to claim 11, characterized in that the step of transmitting the result of said processing by each robot is performed after each processing by a processing module (28).
[0013]
13. A method of automatically inspecting a surface of an object of the aircraft (54), transport vehicle, building or engineering structure type, said surface being liable to present a defect, characterized in that it comprises: a step of acquiring images of at least a portion of the surface to be inspected by each flying robot (14, 14a, 14b, 14c) of a fleet (12) of robots comprising at least one flying robot, and a step of processing the acquired images to provide information representative of the state of each inspected surface portion, called the result of the processing.
Similar technologies:
Publication number | Publication date | Patent title
EP3308232B1|2022-01-05|System and method for automatically inspecting surfaces
US10618168B2|2020-04-14|Robot system path planning for asset health management
US10778967B2|2020-09-15|Systems and methods for improving performance of a robotic vehicle by managing on-board camera defects
US20190068829A1|2019-02-28|Systems and Methods for Improving Performance of a Robotic Vehicle by Managing On-board Camera Obstructions
EP1335258A1|2003-08-13|Method for guiding an aircraft in a final landing phase and corresponding device
EP3270365A1|2018-01-17|A device for assisting the piloting of a rotorcraft, an associated display, and a corresponding method of assisting piloting
FR3003380A1|2014-09-19|METHOD FOR MONITORING THE VEGETATION STATE IN THE VICINITY OF AN INFRASTRUCTURE
FR3026540A1|2016-04-01|SYSTEM AND METHOD FOR MASKING DYNAMIC IMAGE
EP3388914A1|2018-10-17|Target tracking method performed by a drone, related computer program, electronic system and drone
Bonnin-Pascual et al.2019|On the use of robots and vision technologies for the inspection of vessels: A survey on recent advances
EP3361345A1|2018-08-15|A system and a method for assisting landing an aircraft, and a corresponding aircraft
FR3003356A1|2014-09-19|METHOD FOR OBSERVING A ZONE USING A DRONE
CN107885231B|2020-12-29|Unmanned aerial vehicle capturing method and system based on visible light image recognition
WO2018229391A1|2018-12-20|Platform for controlling and tracking inspections of surfaces of objects by inspection robots and inspection system implementing such a platform
EP0604252B1|1997-05-14|Method for aiding piloting of a low flying aircraft
CA3020086A1|2019-04-20|Control process for alert restitution and/or system reconfiguration procedures, associated computer program and control systems
CA3048013C|2021-11-09|Process and drone equipped with a landing/take off assistance system
FR3088308A1|2020-05-15|METHOD FOR MEASURING A LEVEL OF WEAR OF A VEHICLE TIRE.
US11170524B1|2021-11-09|Inpainting image feeds of operating vehicles
EP3798675B1|2021-11-10|Method and system for detecting wired obstacles for aircraft
EP3866136A1|2021-08-18|Method and system to assist with navigation for an aircraft by detecting maritime objects in order to implement an approach flight, hovering or landing
FR3077393A1|2019-08-02|Aerial vehicles with artificial vision
FR3103047A1|2021-05-14|ARTIFICIAL NEURON NETWORK LEARNING PROCESS AND DEVICE FOR AIRCRAFT LANDING ASSISTANCE
WO2022002531A1|2022-01-06|System and method for detecting an obstacle in an area surrounding a motor vehicle
FR3084485A1|2020-01-31|MOTORIZED FLYING MACHINE FOR MEASURING THE RELIEF OF SURFACES OF A PREDETERMINED OBJECT AND METHOD FOR CONTROLLING SUCH A MACHINE
Patent family:
Publication number | Publication date
EP3308232A1|2018-04-18|
CN107709158A|2018-02-16|
WO2016203151A1|2016-12-22|
JP2018521406A|2018-08-02|
SG11201709768YA|2017-12-28|
US20180170540A1|2018-06-21|
FR3037429B1|2018-09-07|
MA42512A|2018-06-06|
KR20180029973A|2018-03-21|
BR112017026528A2|2018-08-14|
EP3308232B1|2022-01-05|
US10377485B2|2019-08-13|
CA2989154A1|2016-12-22|
IL255814D0|2018-01-31|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title
US20100103260A1|2008-10-27|2010-04-29|Williams Scot I|Wind turbine inspection|
US20140278221A1|2013-03-12|2014-09-18|The Boeing Company|Self-Contained Holonomic Tracking Method and Apparatus for Non-Destructive Inspection|
WO2015059241A1|2013-10-24|2015-04-30|Airbus Group Sas|Collaborative robot for visually inspecting an aircraft|
WO2018213860A1|2017-05-23|2018-11-29|Ars Electronica Linz Gmbh & Co Kg|System for controlling unmanned aircraft in a swarm, in order to film a moving object with multiple cameras|
WO2018229391A1|2017-06-15|2018-12-20|Donecle|Platform for controlling and tracking inspections of surfaces of objects by inspection robots and inspection system implementing such a platform|
JPH03255302A|1990-03-05|1991-11-14|Mitsubishi Electric Corp|Detecting apparatus for pattern|
US6907799B2|2001-11-13|2005-06-21|Bae Systems Advanced Technologies, Inc.|Apparatus and method for non-destructive inspection of large structures|
JP4475632B2|2004-03-19|2010-06-09|中国電力株式会社|Transmission line inspection system using unmanned air vehicle|
JP2006027448A|2004-07-16|2006-02-02|Chugoku Electric Power Co Inc:The|Aerial photographing method and device using unmanned flying body|
CN1305194C|2004-12-17|2007-03-14|华北电力大学(北京)|Power circuit scanning test robot airplane and controlling system|
US8060270B2|2008-02-29|2011-11-15|The Boeing Company|System and method for inspection of structures and objects by swarm of remote unmanned vehicles|
US8812154B2|2009-03-16|2014-08-19|The Boeing Company|Autonomous inspection and maintenance|
CN101561683B|2009-04-01|2010-08-11|东南大学|Motion control device of robot for detecting environmental pollution|
US8982207B2|2010-10-04|2015-03-17|The Boeing Company|Automated visual inspection system|
TW201215442A|2010-10-06|2012-04-16|Hon Hai Prec Ind Co Ltd|Unmanned Aerial Vehicle control system and method|
DE102011017564B4|2011-04-26|2017-02-16|Airbus Defence and Space GmbH|Method and system for inspecting a surface for material defects|
EP2527649B1|2011-05-25|2013-12-18|Siemens Aktiengesellschaft|Method to inspect components of a wind turbine|
CN102355569A|2011-05-30|2012-02-15|南京航空航天大学|Aircraft skin structure monitoring method based on wireless machine vision|
CN102354174B|2011-07-30|2012-12-26|山东电力研究院|Inspection system based on mobile inspection apparatus of transformer station and inspection method thereof|
CN102280826B|2011-07-30|2013-11-20|山东鲁能智能技术有限公司|Intelligent robot inspection system and intelligent robot inspection method for transformer station|
CN102510011B|2011-10-24|2014-10-29|华北电力大学|Method for realizing the intelligent tour-inspection of power tower based on miniature multi-rotor unmanned helicopter|
SG11201402114PA|2011-11-09|2014-06-27|Abyssal S A|System and method of operation for remotely operated vehicles with superimposed 3d imagery|
US8833169B2|2011-12-09|2014-09-16|General Electric Company|System and method for inspection of a part with dual multi-axis robotic devices|
CN102566576B|2012-02-24|2014-03-19|山东鲁能智能技术有限公司|Multiple inspection robot cooperative operation method for substation sequence control system|
US8855442B2|2012-04-30|2014-10-07|Yuri Owechko|Image registration of multimodal data using 3D-GeoArcs|
US9651950B2|2012-07-18|2017-05-16|The Boeing Company|Mission re-planning for coordinated multivehicle task allocation|
CN102941920A|2012-12-05|2013-02-27|南京理工大学|High-tension transmission line inspection robot based on multi-rotor aircraft and method using robot|
US20140336928A1|2013-05-10|2014-11-13|Michael L. Scott|System and Method of Automated Civil Infrastructure Metrology for Inspection, Analysis, and Information Modeling|
CN103941746B|2014-03-29|2016-06-01|State Grid Corporation of China|Unmanned aerial vehicle inspection image processing system and method|
CN104199455A|2014-08-27|2014-12-10|中国科学院自动化研究所|Multi-rotor craft based tunnel inspection system|
JP2018506127A|2014-11-24|2018-03-01|Kitov Systems Ltd.|Automatic inspection method|
CN104538899A|2015-01-19|2015-04-22|中兴长天信息技术有限公司|Wireless-transmission-based unmanned aerial vehicle platform for power line inspection|
ITUA20161534A1|2016-03-10|2017-09-10|Wpweb Srl|PROCEDURE FOR ANALYZING AN AIRCRAFT, CORRESPONDING AIRCRAFT ANALYSIS SYSTEM, AND ANTI-ICING AND DE-ICING SYSTEM|
SG10202108173YA|2015-08-17|2021-09-29|H3 Dynamics Holdings Pte Ltd|Drone box|
EP3173979A1|2015-11-30|2017-05-31|Delphi Technologies, Inc.|Method for identification of characteristic points of a calibration pattern within a set of candidate points in an image of the calibration pattern|
EP3174007A1|2015-11-30|2017-05-31|Delphi Technologies, Inc.|Method for calibrating the orientation of a camera mounted to a vehicle|
US10170011B2|2016-07-26|2019-01-01|International Business Machines Corporation|Guide drones for airplanes on the ground|
US10820574B2|2016-07-29|2020-11-03|International Business Machines Corporation|Specialized contextual drones for virtual fences|
US9987971B2|2016-07-29|2018-06-05|International Business Machines Corporation|Drone-enhanced vehicle external lights|
US11107030B2|2016-09-28|2021-08-31|Federal Express Corporation|Enhanced systems, apparatus, and methods for positioning of an airborne relocatable communication hub supporting a plurality of wireless devices|
US20180114302A1|2016-10-23|2018-04-26|The Boeing Company|Lightning strike inconsistency aircraft dispatch mobile disposition tool|
US11145043B2|2016-10-24|2021-10-12|Ford Motor Company|Using unmanned aerial vehicles to inspect autonomous vehicles|
US10825097B1|2016-12-23|2020-11-03|State Farm Mutual Automobile Insurance Company|Systems and methods for utilizing machine-assisted vehicle inspection to identify insurance buildup or fraud|
US10521960B2|2017-05-03|2019-12-31|General Electric Company|System and method for generating three-dimensional robotic inspection plan|
US10235892B1|2017-05-05|2019-03-19|Architecture Technology Corporation|Aircraft surface state event track system and method|
US10682677B2|2017-05-10|2020-06-16|General Electric Company|System and method providing situational awareness for autonomous asset inspection robot monitor|
GB2565757A|2017-07-13|2019-02-27|Sita Information Networking Computing Uk Ltd|Database of Drone flight plans for aircraft inspection using relative mapping|
US20190047577A1|2017-08-09|2019-02-14|Walmart Apollo, Llc|System and method for automated vehicle breakdown recovery|
JP6918672B2|2017-10-11|2021-08-11|Hitachi Systems, Ltd.|Deterioration diagnosis system|
CN108408082A|2018-02-11|2018-08-17|西安航空学院|A kind of unmanned plane and its operating method for big aircraft vertical fin crack detection|
CN110197348B|2018-02-24|2021-11-19|北京图森智途科技有限公司|Autonomous vehicle control method and autonomous vehicle control device|
EP3534334A1|2018-02-28|2019-09-04|Aptiv Technologies Limited|Method for identification of characteristic points of a calibration pattern within a set of candidate points derived from an image of the calibration pattern|
CN108508916B|2018-04-02|2021-05-07|南方科技大学|Control method, device and equipment for unmanned aerial vehicle formation and storage medium|
US10533840B2|2018-04-24|2020-01-14|Gulfstream Aerospace Corporation|Method for characterizing shape changes of an aircraft due to flight loads|
CN108985163A|2018-06-11|2018-12-11|视海博(中山)科技股份有限公司|The safe detection method of restricted clearance based on unmanned plane|
CN108872130B|2018-06-25|2019-08-23|北京空间飞行器总体设计部|Typical aircraft Facing material recognition methods neural network based|
WO2020003818A1|2018-06-28|2020-01-02|Panasonic Intellectual Property Management Co., Ltd.|Inspection instrument and inspection method|
WO2020008344A1|2018-07-04|2020-01-09|Hus Unmanned Systems Pte. Ltd.|Defect detection system using a camera equipped uav for building facades on complex asset geometry with optimal automatic obstacle deconflicted flightpath|
FR3084485A1|2018-07-26|2020-01-31|Donecle|MOTORIZED FLYING MACHINE FOR MEASURING THE RELIEF OF SURFACES OF A PREDETERMINED OBJECT AND METHOD FOR CONTROLLING SUCH A MACHINE|
CN109352621A|2018-10-19|2019-02-19|飞码机器人私人有限公司|A kind of construction quality detection robot system and method|
EP3966728A1|2019-05-07|2022-03-16|The Joan and Irwin Jacobs Technion-Cornell Institute|Systems and methods for detection of anomalies in civil infrastructure using context aware semantic computer vision techniques|
DE102019214139B4|2019-09-17|2021-07-29|Atlas Elektronik Gmbh|Optical mine detection in shallow water|
KR102154249B1|2019-12-26|2020-09-09|엘아이지넥스원 주식회사|Method and Apparatus for Designing Control Timing for Mobile Radar Operation|
CN111429565B|2020-03-18|2021-04-06|中国民航科学技术研究院|System and method for acquiring and managing three-dimensional data on surface of airframe of civil aircraft|
CN112268548B|2020-12-14|2021-03-09|成都飞机工业(集团)有限责任公司|Airplane local appearance measuring method based on binocular vision|
Legal status:
2016-07-08| PLFP| Fee payment|Year of fee payment: 2 |
2016-12-16| PLSC| Publication of the preliminary search report|Effective date: 20161216 |
2017-05-10| PLFP| Fee payment|Year of fee payment: 3 |
2017-07-14| TP| Transmission of property|Owner name: DONECLE, FR Effective date: 20170613 |
2018-04-06| PLFP| Fee payment|Year of fee payment: 4 |
2020-04-06| PLFP| Fee payment|Year of fee payment: 6 |
2021-04-12| PLFP| Fee payment|Year of fee payment: 7 |
Priority:
Application number | Filing date | Patent title
FR1555452|2015-06-15|
FR1555452A|FR3037429B1|2015-06-15|2015-06-15|SYSTEM AND METHOD FOR AUTOMATIC SURFACE INSPECTION|
KR1020177036646A| KR20180029973A|2015-06-15|2016-06-15|Systems and methods for automatically inspecting surfaces|
CA2989154A| CA2989154A1|2015-06-15|2016-06-15|System and method for automatically inspecting surfaces|
SG11201709768YA| SG11201709768YA|2015-06-15|2016-06-15|System and method for automatically inspecting surfaces|
US15/736,952| US10377485B2|2015-06-15|2016-06-15|System and method for automatically inspecting surfaces|
CN201680034120.0A| CN107709158A|2015-06-15|2016-06-15|System and method for checking surface automatically|
JP2017564662A| JP2018521406A|2015-06-15|2016-06-15|System and method for automatically inspecting a surface|
PCT/FR2016/051448| WO2016203151A1|2015-06-15|2016-06-15|System and method for automatically inspecting surfaces|
BR112017026528A| BR112017026528A2|2015-06-15|2016-06-15|? system and method for automatic surface inspection?|
EP16741082.8A| EP3308232B1|2015-06-15|2016-06-15|System and method for automatically inspecting surfaces|
MA042512A| MA42512A|2015-06-15|2016-06-15|AUTOMATIC SURFACE INSPECTION SYSTEM AND PROCESS|
IL255814A| IL255814D0|2015-06-15|2017-11-21|System and method for automatically inspecting surfaces|